YouTube videos tagged "Dataset Distillation"
Dataset Distillation by Matching Training Trajectories | CVPR 2022
[TMLR 2024] Audio-Visual Dataset Distillation
[ICCV 2025] Heavy Labels Out! Dataset Distillation with Label Space Lightening
DataDAM: Efficient Dataset Distillation with Attention Matching
workshop@ECCV: Data-Efficient Generation for Dataset Distillation
CVPR 2025: Towards Universal Dataset Distillation via Task-Driven Diffusion
Flow Map Distillation Without Data (Nov 2025)
[ECCV 2024] Teddy: Efficient Large-Scale Dataset Distillation via Taylor-Approximated Matching
[ICCV 2025 Highlight] Dataset Distillation via Vision-Language Category Prototype
Generalizing Dataset Distillation via Deep Generative Prior | CVPR 2023
Diversity-Enhanced Distribution Alignment for Dataset Distillation
[ICLR 2025] Breaking Class Barriers: Efficient Dataset Distillation via Inter-class Feature Compensator
Knowledge Distillation and Dataset Distillation of Large Language Models: Emerging Trends, Challenges…
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching
CVPR 2025 Paper: Distilling Long-tailed Datasets
[3D-DLAD-v4] Navya 3D Segmentation Dataset for large scale semantic segmentation, Alexandre Almin
Dataset Distillation with Neural Characteristic Function: A Minmax Perspective (SJTU, 2025)
Vision-Language Dataset Distillation
DISL Joint Reading Group: Textual Dataset Distillation via Language Model Embedding
Few-Shot Dataset Distillation via Translative Pre-Training
Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint
DISL Review: Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching
[ICML 2024] Low-Rank Similarity Mining for Multimodal Dataset Distillation
INNS Webinar Series: Dataset Distillation and Pruning: Streamlining Machine Learning Performance
Model Distillation: Same LLM Power but 3240x Smaller